Coordinate Descent Method for Large-scale L2-loss Linear SVM
Authors
Abstract
Linear support vector machines (SVM) are useful for classifying large-scale sparse data. Problems with sparse features are common in applications such as document classification and natural language processing. In this paper, we propose a novel coordinate descent algorithm for training linear SVM with the L2-loss function. At each step, the proposed method minimizes a one-variable sub-problem while fixing the other variables. The sub-problem is solved by Newton steps with a line search technique. The procedure globally converges at a linear rate. Experiments show that our method is more efficient and stable than state-of-the-art methods such as Pegasos and TRON.
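The abstract describes coordinate descent where each one-variable sub-problem is solved by a Newton step with a line search. A minimal sketch of that idea for the primal L2-loss SVM objective, 0.5‖w‖² + C Σᵢ max(0, 1 − yᵢwᵀxᵢ)², might look like the following (illustrative Python, not the paper's optimized implementation; function names and the line-search constants σ, β are assumptions):

```python
import numpy as np

def f_obj(w, X, y, C):
    """Primal L2-loss SVM objective: 0.5*||w||^2 + C * sum_i max(0, 1 - y_i w^T x_i)^2."""
    loss = np.maximum(0.0, 1.0 - y * (X @ w))
    return 0.5 * (w @ w) + C * np.sum(loss ** 2)

def cd_l2svm(X, y, C=1.0, sweeps=50, sigma=0.01, beta=0.5):
    """Cyclic coordinate descent: one Newton step per coordinate,
    with backtracking line search on a sufficient-decrease condition.
    Illustrative sketch only (recomputes margins for clarity, not speed)."""
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(sweeps):
        for j in range(p):
            m = 1.0 - y * (X @ w)                 # margins 1 - y_i w^T x_i
            act = m > 0                           # instances incurring loss
            # first derivative and generalized second derivative along e_j
            g = w[j] - 2 * C * np.sum(y[act] * X[act, j] * m[act])
            h = 1.0 + 2 * C * np.sum(X[act, j] ** 2)
            d = -g / h                            # Newton direction for the 1-D sub-problem
            lam, f0 = 1.0, f_obj(w, X, y, C)
            while lam > 1e-8:                     # backtracking line search
                e = np.zeros(p)
                e[j] = lam * d
                if f_obj(w + e, X, y, C) <= f0 + sigma * lam * g * d:
                    break
                lam *= beta
            w[j] += lam * d
    return w
```

Because the L2 loss is differentiable (with a generalized second derivative h ≥ 1), the Newton step is well defined at every coordinate, and the full step λ = 1 is typically accepted.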
Similar papers
Coordinate Descent Method for Large-scale L2-loss Linear Support Vector Machines
Linear support vector machines (SVM) are useful for classifying large-scale sparse data. Problems with sparse features are common in applications such as document classification and natural language processing. In this paper, we propose a novel coordinate descent algorithm for training linear SVM with the L2-loss function. At each step, the proposed method minimizes a one-variable sub-problem w...
Supplementary materials for "Parallel Dual Coordinate Descent Method for Large-scale Linear Classification in Multi-core Environments"
f(α) ≡ g(Eα) + bᵀα, where f(·) and g(·) are proper closed functions, E is a constant matrix, and Lᵢ ∈ [−∞, ∞), Uᵢ ∈ (−∞, ∞] are the lower/upper bounds on αᵢ. It has been checked in [1] that L1- and L2-loss SVM are in the form of (I.1) and satisfy the additional assumptions needed in [4]. We introduce an important class of gradient-based schemes for CD's variable selection: the Gauss-Southwell rule. It plays an importan...
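The snippet mentions the Gauss-Southwell rule for coordinate selection under box constraints Lᵢ ≤ αᵢ ≤ Uᵢ. A hedged illustration of the selection rule only (names are hypothetical, not from the paper's code): pick the coordinate whose projected gradient has the largest magnitude, since a coordinate pinned at a bound can only move in one direction.

```python
import numpy as np

def gauss_southwell(grad, w, L, U):
    """Gauss-Southwell selection for min f(w) s.t. L <= w <= U:
    return the index with the largest projected-gradient magnitude.
    At a lower bound only increasing w_i is feasible (keep min(g_i, 0));
    at an upper bound only decreasing w_i is feasible (keep max(g_i, 0))."""
    pg = grad.copy()
    at_lo = np.isclose(w, L)
    at_hi = np.isclose(w, U)
    pg[at_lo] = np.minimum(grad[at_lo], 0.0)
    pg[at_hi] = np.maximum(grad[at_hi], 0.0)
    return int(np.argmax(np.abs(pg)))
```

Note how the bounds change the choice: a large positive gradient component at its lower bound is already optimal in that coordinate and is skipped.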
Indexed Learning for Large-Scale Linear Classification
Linear classification has achieved complexity linear to the data size. However, in many applications, large-scale data contains only a few samples that can improve the target objective. In this paper, we propose a sublinear-time algorithm that uses Nearest-Neighbor-based Coordinate Descent method to solve Linear SVM with truncated loss. In particular, we propose a sequential relaxation that sol...
Pascal Challenge: Linear Support Vector Machines
ξ(w; xi, yi) = max(1 − yi wᵀxi, 0). We consider the LIBLINEAR package (available at http://www.csie.ntu.edu.tw/~cjlin/liblinear), which can handle L1- and L2-loss linear SVMs. The L1-SVM solver implemented in LIBLINEAR employs a coordinate descent method to solve the dual problem. Details are in [1]. This method is very useful for large sparse data with a huge number of instances and features. ...
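The loss ξ above is the L1 (hinge) loss; the L2-loss variant simply squares each term. A small worked example (illustrative helper, not LIBLINEAR code):

```python
import numpy as np

def hinge_losses(w, X, y):
    """Per-instance L1 hinge loss xi(w; x_i, y_i) = max(1 - y_i * w^T x_i, 0),
    plus its L2-loss counterpart, which squares each term."""
    m = 1.0 - y * (X @ w)
    l1 = np.maximum(m, 0.0)
    return l1, l1 ** 2
```

For example, with w = (1, 0) an instance with yᵢwᵀxᵢ = 2 incurs zero loss, one with margin 0.5 incurs loss 0.5 (L1) or 0.25 (L2), and a misclassified instance with margin −1 incurs loss 2 or 4.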
A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training
Support vector machines (SVMs) training may be posed as a large quadratic program (QP) with bound constraints and a single linear equality constraint. We propose a (block) coordinate gradient descent method for solving this problem and, more generally, linearly constrained smooth optimization. Our method is closely related to decomposition methods currently popular for SVM training. We establis...